# 3.6B Parameters
## Japanese GPT-NeoX 3.6B Instruction PPO
License: MIT
A 3.6-billion-parameter Japanese GPT-NeoX model fine-tuned with Reinforcement Learning from Human Feedback (RLHF), making it better at following instructions in conversation. A hedged usage sketch follows this card.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Publisher: rinna · Downloads: 3,062 · Likes: 71
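Since the card lists Transformers-compatible weights, here is a minimal usage sketch for the instruction-tuned model. It assumes the Hugging Face model id `rinna/japanese-gpt-neox-3.6b-instruction-ppo` and the speaker-tagged prompt format (with the literal token `<NL>` standing in for newlines) documented on rinna's model card; check both against the upstream card before relying on them.

```python
# Hedged sketch: model id and prompt format are assumptions taken from
# rinna's published model card, not verified here.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rinna/japanese-gpt-neox-3.6b-instruction-ppo"

# The rinna tokenizers are SentencePiece-based; the slow tokenizer is used here.
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(model_id)
if torch.cuda.is_available():
    model = model.half().cuda()

# Assumed dialogue format: "ユーザー: <question><NL>システム: "
prompt = "ユーザー: 日本の首都はどこですか？<NL>システム: "

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=0.7,
        pad_token_id=tokenizer.pad_token_id,
    )

# Decode only the newly generated tokens and restore real newlines.
reply = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(reply.replace("<NL>", "\n"))
```

The same loading pattern applies to the base model below; only the prompt format differs, since the base model is a plain causal language model rather than a dialogue-tuned one.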
## Japanese GPT-NeoX 3.6B
License: MIT
A 3.6-billion-parameter Japanese GPT-NeoX model based on the Transformer architecture, trained on 312.5 billion tokens of Japanese text.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Publisher: rinna · Downloads: 34.74k · Likes: 99